Estimating the Renyi entropy of several exponential populations

Authors
Abstract


Similar articles

Renyi Entropy Estimation Revisited

We revisit the problem of estimating the entropy of discrete distributions from independent samples, studied recently by Acharya, Orlitsky, Suresh and Tyagi (SODA 2015), improving their upper and lower bounds on the necessary sample size n. For estimating Renyi entropy of order α, up to constant accuracy and error probability, we show the following upper bounds: $n = O(1) \cdot 2^{(1-1/\alpha)H_\alpha}$ for integer α...

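The bounds above concern estimators considerably more refined than the naive rule, but a minimal plug-in sketch illustrates what is being estimated; the function name and the biased-die example below are our own assumptions, not taken from the paper.

```python
import numpy as np
from collections import Counter

def renyi_entropy_plugin(samples, alpha):
    """Naive plug-in Renyi entropy (in bits) from empirical frequencies."""
    n = len(samples)
    p = np.array([c / n for c in Counter(samples).values()])
    if alpha == 1.0:                          # Shannon limit of the Renyi family
        return -np.sum(p * np.log2(p))
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

# Example: 10,000 draws from a biased four-sided die.
rng = np.random.default_rng(0)
samples = rng.choice(4, size=10_000, p=[0.4, 0.3, 0.2, 0.1])
print(renyi_entropy_plugin(samples, alpha=2))  # estimate of the collision entropy
```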

Shannon Entropy, Renyi Entropy, and Information

This memo contains proofs that the Shannon entropy is the limiting case of both the Renyi entropy and the Tsallis entropy, or information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures. A brief summary of the origins of the concept of physical entropy is provided in an appendix.

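A quick numerical illustration of the limiting statement (the distribution and the values of alpha are our own choices): both the Renyi and the Tsallis entropy approach the Shannon entropy as the order tends to 1.

```python
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])
shannon = -np.sum(p * np.log(p))              # Shannon entropy in nats

for alpha in (2.0, 1.5, 1.1, 1.01, 1.001):
    renyi = np.log(np.sum(p ** alpha)) / (1.0 - alpha)
    tsallis = (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)
    print(f"alpha={alpha:5.3f}  Renyi={renyi:.4f}  Tsallis={tsallis:.4f}  Shannon={shannon:.4f}")
```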

Testing Exponentiality Based on Renyi Entropy of Transformed Data

In this paper, we introduce new tests for exponentiality based on estimators of Renyi entropy of a continuous random variable. We first consider two transformations of the observations which turn the test of exponentiality into one of uniformity and use a corresponding test based on Renyi entropy. Critical values of the test statistics are computed by Monte Carlo simulations. Then, we compare p...

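The following is only a rough sketch of the general recipe described above, with our own (assumed) choices of transformation, entropy estimator, and significance level rather than the paper's exact test statistic: transform the data so that exponentiality becomes uniformity, estimate the Renyi entropy of the transformed sample, and calibrate the critical value by Monte Carlo under the exponential null.

```python
import numpy as np

def renyi_entropy_hist(u, alpha=2.0, bins=10):
    """Histogram plug-in for the differential Renyi entropy on [0, 1]."""
    counts, edges = np.histogram(u, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    w = edges[1] - edges[0]
    return np.log(w) + np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def test_statistic(x, alpha=2.0):
    """Renyi entropy of the probability-integral transform with estimated rate."""
    u = 1.0 - np.exp(-x / x.mean())       # approximately uniform under exponentiality
    return renyi_entropy_hist(u, alpha)   # maximal (= 0) for an exact uniform sample

rng = np.random.default_rng(1)
n, n_mc = 50, 5000
null_stats = np.array([test_statistic(rng.exponential(size=n)) for _ in range(n_mc)])
crit = np.quantile(null_stats, 0.05)      # reject exponentiality for small entropy values

x_obs = rng.weibull(2.0, size=n)          # a non-exponential alternative
print("statistic:", test_statistic(x_obs), " critical value:", crit)
```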

Shannon and Renyi Entropy of Wavelets

This paper reports a new reading of wavelets based on the classical 'De Broglie' principle. The wave-particle duality principle is adapted to wavelets. Every continuous basic wavelet is associated with a proper probability density, which allows defining the Shannon entropy of a wavelet. Further entropy definitions are considered, such as the Jumarie or Renyi entropy of wavelets. We proved tha...

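A small numerical sketch of that construction under our reading of the premise (the specific wavelet and normalization are assumptions, not necessarily the paper's): normalize |psi(t)|^2 to a probability density and compute its differential Shannon entropy for the Mexican-hat wavelet.

```python
import numpy as np

t = np.linspace(-8.0, 8.0, 4001)
dt = t[1] - t[0]
psi = (1.0 - t**2) * np.exp(-t**2 / 2.0)       # Mexican-hat (Ricker) wavelet, unnormalized

p = psi**2 / (np.sum(psi**2) * dt)             # |psi|^2 normalized to a probability density
nz = p > 0                                     # skip the wavelet's zeros to avoid log(0)
shannon = -np.sum(p[nz] * np.log(p[nz]) * dt)  # differential Shannon entropy in nats
print("Shannon entropy of the Mexican-hat wavelet's density:", shannon)
```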

A Comprehensive Comparison of Shannon Entropy and Smooth Renyi Entropy

We provide a new result that links two crucial entropy notions: Shannon entropy H1 and collision entropy H2. Our formula gives the worst possible amount of collision entropy in a probability distribution when its Shannon entropy is fixed. Our results and techniques used in the proof immediately imply many quantitatively tight separations between Shannon and smooth Renyi entropy, which were pre...

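A small numerical check of the ordering behind that result (the "one heavy atom plus a uniform tail" distribution is our own example, not one from the paper): collision entropy H2 never exceeds Shannon entropy H1, and the gap can be large.

```python
import numpy as np

def H1(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def H2(p):
    """Collision (Renyi order-2) entropy in bits."""
    return -np.log2(np.sum(p ** 2))

k = 2**16
p = np.concatenate(([0.5], np.full(k, 0.5 / k)))    # one heavy atom plus a uniform tail
print(f"H1 = {H1(p):.3f} bits, H2 = {H2(p):.3f} bits")  # H2 falls far below H1
```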


Journal

Journal title: Brazilian Journal of Probability and Statistics

Year: 2015

ISSN: 0103-0752

DOI: 10.1214/13-bjps230